
Evolving Neural Networks in JAX

#artificialintelligence

"So why should I switch from insert-autodiff-library to JAX?". Here is my answer: JAX is not simply a fast library for automatic differentiation. If your scientific computing project wants to benefit from XLA, JIT-compilation and the bulk-array programming paradigm -- then JAX provides a wonderful API. While PyTorch relies on pre-compiled kernels and fast C code for most common Deep Learning applications, JAX allows us to leverage a high-level interface for programming your favorite accelerators. But this is not restricted to standard gradient-based optimization setups.


Evolving Neural Networks

#artificialintelligence

In the __init__ function, we set up the network. The parameter dimensions is a list of layer widths: the first entry is the input width, the last is the output width, and all entries in between are hidden widths. Given n entries, __init__ iterates through them to create n-1 weight matrices using Glorot Normal initialization, which are stored as the network's layers. If bias is enabled, a non-zero bias vector is also stored for each layer. The model applies a ReLU activation to all internal layers.
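The setup described above could be sketched in JAX roughly as follows. This is an illustrative reconstruction, not the article's actual code: the class name, the seed handling, and the 0.1 bias constant are assumptions, while the Glorot Normal initializer comes from `jax.nn.initializers`.

```python
import jax
import jax.numpy as jnp

class MLP:
    def __init__(self, dimensions, bias=True, seed=0):
        # dimensions = [input_width, *hidden_widths, output_width]
        key = jax.random.PRNGKey(seed)
        init = jax.nn.initializers.glorot_normal()
        self.layers = []
        # n dimensions -> n-1 weight matrices
        for d_in, d_out in zip(dimensions[:-1], dimensions[1:]):
            key, subkey = jax.random.split(key)
            W = init(subkey, (d_in, d_out))                  # Glorot Normal weights
            b = jnp.full((d_out,), 0.1) if bias else None    # non-zero bias (assumed value)
            self.layers.append((W, b))

    def __call__(self, x):
        # ReLU on all internal layers, linear final layer
        for i, (W, b) in enumerate(self.layers):
            x = x @ W + (b if b is not None else 0.0)
            if i < len(self.layers) - 1:
                x = jax.nn.relu(x)
        return x
```

For example, `MLP([3, 8, 2])` builds two weight matrices (3x8 and 8x2), with ReLU applied only after the hidden layer.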